Results 1 - 20 of 15,575
1.
Hum Brain Mapp ; 45(5): e26673, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38590248

ABSTRACT

The amygdala is important for human fear processing. However, recent research has failed to reveal specificity, with evidence that the amygdala also responds to other emotions. A more nuanced understanding of the amygdala's role in emotion processing, particularly relating to fear, is needed given the importance of effective emotional functioning for everyday life and mental health. We studied 86 healthy participants (44 females), aged 18-49 (mean 26.12 ± 6.6) years, who underwent multiband functional magnetic resonance imaging. We specifically examined the reactivity of four amygdala subregions (using region-of-interest analysis) and related brain connectivity networks (using generalized psychophysiological interaction) to fearful, angry, and happy facial stimuli in an emotional face-matching task. All amygdala subregions responded to all stimuli (p-FDR < .05), with this reactivity strongly driven by the superficial and centromedial amygdala (p-FDR < .001). Yet amygdala subregions showed selectively strong functional connectivity with other occipitotemporal and inferior frontal brain regions, with particular sensitivity to fear recognition, driven strongly by the basolateral amygdala (p-FDR < .05). These findings suggest that amygdala specialization for fear may be reflected not in its local activity but in its connectivity with other brain regions within a specific face-processing network.


Subjects
Brain; Emotions; Female; Humans; Emotions/physiology; Fear/psychology; Amygdala/physiology; Happiness; Brain Mapping/methods; Magnetic Resonance Imaging; Facial Expression
2.
PLoS One ; 19(4): e0290590, 2024.
Article in English | MEDLINE | ID: mdl-38635525

ABSTRACT

Spontaneous smiles in response to politicians can serve as an implicit barometer for gauging electorate preferences. However, it is unclear whether a subtle Duchenne smile, an authentic expression involving coactivation of the zygomaticus major (ZM) and orbicularis oculi (OO) muscles, would be elicited while reading about a favored politician smiling, indicating a more positive disposition and political endorsement. From an embodied simulation perspective, we investigated whether written descriptions of a politician's smile would trigger morphologically different smiles in readers depending on shared or opposing political orientation. In a controlled laboratory reading task, participants were presented with subject-verb phrases describing left- and right-wing politicians smiling or frowning. Concurrently, their facial muscular reactions were measured via electromyography (EMG) at three facial muscles: the ZM and OO, coactive during Duchenne smiles, and the corrugator supercilii (CS), involved in frowning. Participants responded with a Duchenne smile, detected at the ZM and OO muscles, when exposed to portrayals of smiling politicians of the same political orientation, and reported more positive emotions toward them. In contrast, when reading about outgroup politicians smiling, there was weaker activation of the ZM muscle and no activation of the OO muscle, suggesting a weak non-Duchenne smile, and reported emotions toward outgroup politicians were significantly more negative. An enhanced frown response in the CS was also found for ingroup compared to outgroup politicians' frown expressions. These findings suggest that a politician's smile may go a long way toward influencing electorates through both non-verbal and verbal pathways, adding another layer to our understanding of how language and social information shape embodied effects in a highly nuanced manner. Implications for verbal communication in the political context are discussed.


Subjects
Frailty; Smiling; Humans; Smiling/physiology; Reading; Facial Expression; Emotions/physiology; Facial Muscles/physiology; Eyelids
3.
Cereb Cortex ; 34(4)2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38566513

ABSTRACT

The perception of facial expression plays a crucial role in social communication, and it is known to be influenced by various facial cues. Previous studies have reported both positive and negative biases toward overweight individuals, but it is unclear whether facial cues such as facial weight bias facial expression perception. Combining psychophysics and event-related potential (ERP) technology, the current study adopted a cross-adaptation paradigm to examine this issue. The psychophysical results of Experiments 1A and 1B revealed a bidirectional cross-adaptation effect between overweight and angry faces. Adapting to overweight faces decreased the likelihood of perceiving ambiguous emotional expressions as angry compared to adapting to normal-weight faces. Likewise, exposure to angry faces subsequently caused normal-weight faces to appear thinner. These findings were corroborated by bidirectional ERP results: adaptation to overweight faces relative to normal-weight faces modulated the ERP responses to emotionally ambiguous facial expressions (Experiment 2A), and, vice versa, adaptation to angry faces relative to neutral faces modulated the ERP responses to faces of ambiguous weight (Experiment 2B). Our study provides direct evidence associating overweight faces with facial expression, suggesting at least partly common neural substrates for the perception of overweight and angry faces.


Subjects
Facial Expression; Weight Prejudice; Humans; Overweight; Anger/physiology; Evoked Potentials/physiology; Emotions/physiology
4.
BMC Psychiatry ; 24(1): 307, 2024 Apr 23.
Article in English | MEDLINE | ID: mdl-38654234

ABSTRACT

BACKGROUND: Obstructive sleep apnea-hypopnea syndrome (OSAHS) is a chronic breathing disorder characterized by recurrent upper airway obstruction during sleep. Although previous studies have shown a link between OSAHS and depressive mood, the neurobiological mechanisms underlying mood disorders in OSAHS patients remain poorly understood. This study aims to investigate the emotion processing mechanism in OSAHS patients with depressive mood using event-related potentials (ERPs). METHODS: Seventy-four OSAHS patients were divided into depressive mood and non-depressive mood groups according to their Self-rating Depression Scale (SDS) scores. Patients underwent overnight polysomnography and completed various cognitive and emotional questionnaires. The patients were shown facial images displaying positive, neutral, and negative emotions and were tasked to identify the emotion category while their visual evoked potentials were simultaneously recorded. RESULTS: The two groups did not differ significantly in age, BMI, or years of education, but showed significant differences in slow-wave sleep ratio (P = 0.039) and in ESS (P = 0.006), MMSE (P < 0.001), and MoCA scores (P = 0.043). No significant difference was found in accuracy or response time for emotional face recognition between the two groups. N170 latency in the depressive group was significantly longer than in the non-depressive group at the bilateral parieto-occipital lobes (P = 0.014 and 0.007), while no significant difference in N170 amplitude was found. No significant difference in P300 amplitude or latency was found between the two groups. Furthermore, N170 amplitude at PO7 was positively correlated with the arousal index and negatively correlated with MoCA scores (both P < 0.01). CONCLUSION: OSAHS patients with depressive mood exhibit increased N170 latency and impaired facial emotion recognition ability. Special attention toward depressive mood among OSAHS patients is warranted given its implications for patient care.


Subjects
Depression; Emotions; Sleep Apnea, Obstructive; Humans; Male; Middle Aged; Sleep Apnea, Obstructive/physiopathology; Sleep Apnea, Obstructive/psychology; Sleep Apnea, Obstructive/complications; Depression/physiopathology; Depression/psychology; Depression/complications; Female; Adult; Emotions/physiology; Polysomnography; Evoked Potentials/physiology; Electroencephalography; Facial Recognition/physiology; Evoked Potentials, Visual/physiology; Facial Expression
5.
PLoS One ; 19(4): e0301896, 2024.
Article in English | MEDLINE | ID: mdl-38598520

ABSTRACT

This study investigates whether humans recognize different emotions conveyed only by the kinematics of a single moving geometrical shape, and how this competence unfolds during development from childhood to adulthood. To this aim, animations in which a shape moved according to happy, fearful, or neutral cartoons were shown, in a forced-choice paradigm, to 7- and 10-year-old children and to adults. Accuracy and response times were recorded, and mouse movements were tracked while participants selected a response. Results showed that 10-year-old children and adults recognize happiness and fear when conveyed solely by different kinematics, with an advantage for fearful stimuli. Fearful stimuli were also accurately identified by 7-year-olds, together with neutral stimuli, while at this age accuracy for happiness was not significantly different from chance. Overall, the results demonstrate that emotions can be identified from single-point motion alone during both childhood and adulthood. Moreover, motion contributes to the comprehension of emotions to varying degrees, with fear recognized earlier in development and more readily even later on, when all emotions are accurately labeled.


Subjects
Emotions; Facial Expression; Adult; Child; Humans; Biomechanical Phenomena; Emotions/physiology; Fear; Happiness
6.
BMC Psychiatry ; 24(1): 226, 2024 Mar 26.
Article in English | MEDLINE | ID: mdl-38532335

ABSTRACT

BACKGROUND: Patients with schizophrenia (SCZ) exhibit deficits in recognizing facial expressions with unambiguous valence. However, only a limited number of studies have examined how these patients fare in interpreting facial expressions with ambiguous valence (for example, surprise). Thus, we aimed to explore the influence of emotional background information on the recognition of ambiguous facial expressions in SCZ. METHODS: A 3 (emotion: negative, neutral, and positive) × 2 (group: healthy controls and SCZ) experimental design was adopted. The experimental materials consisted of 36 images each of negative, neutral, and positive emotional content, plus 36 images of surprised facial expressions. In each trial, a briefly presented surprised face was preceded by an affective image. Participants (36 SCZ and 36 healthy controls (HC)) rated the emotional experience induced by the surprised facial expressions on a 9-point scale. The data were analyzed using analyses of variance (ANOVAs) and correlation analyses. RESULTS: First, the SCZ group reported a more positive emotional experience under the positive cued condition than under the negative cued condition, while the HC group reported the strongest positive emotional experience in the positive cued condition, a moderate experience in the neutral cued condition, and the weakest in the negative cued condition. Second, the SCZ (vs. HC) group showed longer reaction times (RTs) for recognizing surprised facial expressions. The severity of schizophrenia symptoms in the SCZ group was negatively correlated with rating scores for emotional experience under the neutral and positive cued conditions. CONCLUSIONS: Recognition of surprised facial expressions was influenced by background information in both SCZ and HC, and by negative symptoms in SCZ. The present study indicates that the role of background information should be fully considered when examining the ability of SCZ patients to recognize ambiguous facial expressions.


Subjects
Facial Recognition; Schizophrenia; Humans; Emotions; Recognition, Psychology; Facial Expression; China
7.
Sci Rep ; 14(1): 5574, 2024 03 06.
Article in English | MEDLINE | ID: mdl-38448642

ABSTRACT

Seeing an angry individual in close physical proximity can not only result in a larger retinal representation of that individual and an enhanced resolution of emotional cues, but may also increase motivation for rapid visual processing and action preparation. The present study investigated the effects of stimulus size and emotional expression on the perception of happy, angry, non-expressive, and scrambled faces. We analyzed event-related potentials (ERPs) and behavioral responses of N = 40 participants who performed a naturalness classification task on real and artificially created facial expressions. While the emotion-related effects on accuracy for recognizing authentic expressions were modulated by stimulus size, ERPs showed only additive effects of stimulus size and emotional expression, with no significant interaction. This contrasts with previous research on emotional scenes and words. Effects of size were present in all ERP components examined, whereas emotional expressions affected the N170, EPN, and LPC irrespective of size. These results imply that the decoding of emotional valence in faces can occur even for small stimuli. Supra-additive effects in faces may require larger size ranges or dynamic stimuli that increase arousal.


Subjects
Emotions; Facial Expression; Humans; Physical Examination; Anger; Visual Perception
8.
PLoS One ; 19(3): e0299103, 2024.
Article in English | MEDLINE | ID: mdl-38551903

ABSTRACT

Brain processes associated with emotion perception from biological motion have been largely investigated using point-light displays, which are devoid of pictorial information and not representative of everyday life. In this study, we investigated the brain signals evoked when perceiving emotions arising from the body movements of virtual pedestrians walking in a community environment. Magnetoencephalography was used to record brain activation in 21 healthy young adults discriminating the emotional gaits (neutral, angry, happy) of virtual male/female pedestrians. Event-related responses in the posterior superior temporal sulcus (pSTS), fusiform body area (FBA), extrastriate body area (EBA), amygdala (AMG), and lateral occipital cortex (Occ) were examined. Brain signals were characterized by an early positive peak (P1; ~200 ms) and a late positive potential component (LPP) comprising early (400-600 ms), middle (600-1000 ms), and late (1000-1500 ms) phases. Generalized estimating equations revealed that P1 amplitude was unaffected by the emotion and gender of pedestrians. LPP amplitude showed a significant emotion × phase interaction in all regions of interest, revealing (i) an emotion-dependent modulation starting in pSTS and Occ, followed by AMG, FBA, and EBA, and (ii) generally enhanced responses to angry vs. other gait stimuli in the middle LPP phase. The LPP also showed a gender × phase interaction in pSTS and Occ, as gender affected the time course of the response to emotional gait. Present findings show that brain activation within areas associated with biological motion, form, and emotion processing is modulated by emotional gait stimuli rendered by virtual simulations representative of everyday life.


Subjects
Brain; Magnetoencephalography; Young Adult; Female; Humans; Male; Brain/physiology; Emotions/physiology; Gait; Perception; Evoked Potentials; Electroencephalography; Facial Expression
9.
Psychol Sci ; 35(4): 405-414, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38489402

ABSTRACT

Ethnic out-group members are disproportionately often the victims of misidentification. The so-called other-race effect (ORE), the tendency to better remember faces of individuals belonging to one's own ethnic in-group than faces belonging to an ethnic out-group, has been identified as one causal ingredient in such tragic incidents. Investigating an important aspect of the ORE, namely emotional expression, the seminal study by Ackerman and colleagues (2006) found that White participants remembered neutral White faces better than neutral Black faces, but crucially, angry Black faces were better remembered than angry White faces (i.e., a reversed ORE). In the current study, we sought to replicate this study and directly tackle the potential causes of diverging results in later work. Three hundred ninety-six adult White U.S. citizens completed our study, in which we manipulated the kind of stimuli employed (as in the original study vs. more standardized ones) and whether participants already knew of the recognition task at the encoding phase. Additionally, participants were asked about the unusualness of the presented faces. We were able to replicate the results of the Ackerman et al. (2006) study with the original stimuli but not with more standardized stimuli.


Subjects
Anger; Mental Recall; Adult; Humans; Recognition, Psychology; Ethnicity; Facial Expression
10.
IEEE Trans Vis Comput Graph ; 30(5): 2206-2216, 2024 May.
Article in English | MEDLINE | ID: mdl-38437082

ABSTRACT

In Mixed Reality (MR), users' heads are largely (if not completely) occluded by the MR Head-Mounted Display (HMD) they are wearing. As a consequence, one cannot see their facial expressions and other communication cues when interacting locally. In this paper, we investigate how displaying virtual avatars' heads on top of the (HMD-occluded) heads of participants in a Video See-Through (VST) Mixed Reality local collaborative task could improve their collaboration as well as social presence. We hypothesized that virtual heads would convey more communicative cues (such as eye direction or facial expressions) hidden by the MR HMDs and lead to better collaboration and social presence. To do so, we conducted a between-subject study (n = 88) with two independent variables: the type of avatar (CartoonAvatar/RealisticAvatar/NoAvatar) and the level of facial expressions provided (HighExpr/LowExpr). The experiment involved two dyadic communication tasks: (i) the "20-question" game, where one participant asks questions to guess a hidden word known by the other participant, and (ii) an urban planning problem, where participants have to solve a puzzle by collaborating. Each pair of participants performed both tasks using a specific type of avatar and facial animation. Our results indicate that while adding an avatar's head does not necessarily improve social presence, the amount of facial expressions provided through the social interaction does have an impact. Moreover, participants rated their performance higher when observing a realistic avatar but rated the cartoon avatars as less uncanny. Taken together, our results contribute to a better understanding of the role of partial avatars in local MR collaboration and pave the way for further research exploring collaboration in different scenarios, with different avatar types or MR setups.


Subjects
Augmented Reality; Humans; User-Computer Interface; Computer Graphics; Facial Expression
11.
Biol Psychol ; 187: 108771, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38460756

ABSTRACT

The ability to detect and recognize facial emotions emerges in childhood and is important for understanding social cues, but we know relatively little about how individual differences in temperament may influence early emotional face processing. We used a sample of 419 children (Mage = 10.57 years, SD = 1.75; 48% female; 77% White) to examine the relation between temperamental shyness and early stages of emotional face processing (assessed using the P100 and N170 event-related potentials) during different facial expressions (neutral, anger, fear, and happy). We found that higher temperamental shyness was related to greater P100 activation to faces expressing anger and fear relative to neutral faces. Further, lower temperamental shyness was related to greater N170 activation to faces expressing anger and fear relative to neutral faces. There were no relations between temperamental shyness and neural activation to happy faces relative to neutral faces for P100 or N170, suggesting specificity to faces signaling threat. We discuss findings in the context of understanding the early processing of facial emotional display of threat among shy children.


Subjects
Facial Recognition; Shyness; Child; Humans; Female; Male; Facial Recognition/physiology; Emotions/physiology; Evoked Potentials/physiology; Anger; Facial Expression; Electroencephalography
12.
Biol Psychol ; 187: 108774, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38471619

ABSTRACT

There has been disagreement regarding the relationship among the three components of emotional responses: subjective experience, external performance, and physiological response. To investigate this issue, this study compared the effects of active and passive suppression of facial expressions on subjective experiences and event-related potentials (ERPs) across two experiments. The two methods of expression suppression produced opposite patterns of ERPs for negative emotional stimuli: compared with the free-viewing condition, active suppression decreased, while passive suppression increased, the amplitude of the late positive potential (LPP) when viewing negative emotional stimuli. Further, while active suppression had no effect on participants' emotional experience, passive suppression enhanced it. Among the three components of emotional responses, facial expressions are more closely related to the physiological response of the brain than to subjective experience, and whether suppression is initiated by the participant determines the decrease or increase in the brain's physiological response (i.e., the LPP). The findings reveal the important role of individual subjective initiative in modulating the relationship among the components of emotional response, providing new insights into effective emotion regulation.


Subjects
Emotional Regulation; Facial Expression; Humans; Evoked Potentials/physiology; Emotions/physiology; Brain/physiology; Electroencephalography
13.
Physiol Behav ; 278: 114519, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38490365

ABSTRACT

Major functions of the olfactory system include guiding ingestion and avoidance of environmental hazards. People with anosmia report reliance on others, for example to check the edibility of food, as their primary coping strategy. Facial expressions are a major source of non-verbal social information that can be used to guide approach and avoidance behaviour. Thus, it is of interest to explore whether a life-long absence of the sense of smell heightens sensitivity to others' facial emotions, particularly those depicting threat. In the present online study, 28 people with congenital anosmia (mean age 43.46) and 24 people reporting no olfactory dysfunction (mean age 42.75) completed a facial emotion recognition task in which emotionally neutral faces (6 different identities) morphed, over 40 stages, to express one of 5 basic emotions: anger, disgust, fear, happiness, or sadness. Results showed that, while the groups did not differ in their ability to identify the final, full-strength emotional expressions, nor in the accuracy of their first response, the congenital anosmia group successfully identified the emotions at significantly lower intensity (i.e. an earlier stage of the morph) than the control group. Exploratory analysis showed this main effect was primarily driven by an advantage in detecting anger and disgust. These findings indicate that the absence of a functioning sense of smell during development leads to compensatory changes in visual social cognition. Future work should explore the neural and behavioural basis for this advantage.


Subjects
Facial Recognition; Olfaction Disorders/congenital; Humans; Adult; Emotions/physiology; Fear/physiology; Anger/physiology; Facial Expression; Happiness
14.
Proc Natl Acad Sci U S A ; 121(14): e2313665121, 2024 Apr 02.
Article in English | MEDLINE | ID: mdl-38530896

ABSTRACT

Facial emotion expressions play a central role in interpersonal interactions; these displays are used to predict and influence the behavior of others. Despite their importance, quantifying and analyzing the dynamics of brief facial emotion expressions remains an understudied methodological challenge. Here, we present a method that leverages machine learning and network modeling to assess the dynamics of facial expressions. Using video recordings of clinical interviews, we demonstrate the utility of this approach in a sample of 96 people diagnosed with psychotic disorders and 116 never-psychotic adults. Participants diagnosed with schizophrenia tended to move from neutral expressions to uncommon expressions (e.g., fear, surprise), whereas participants diagnosed with other psychoses (e.g., mood disorders with psychosis) moved toward expressions of sadness. This method has broad applications to the study of normal and altered expressions of emotion and can be integrated with telemedicine to improve psychiatric assessment and treatment.
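The network-modeling step this abstract describes (movement between expression states over the course of an interview) can be illustrated with a minimal sketch, assuming per-frame expression labels have already been produced by an upstream classifier. The label sequence and function name below are illustrative assumptions, not the authors' code.

```python
from collections import defaultdict

def transition_matrix(labels):
    """Estimate first-order transition probabilities between
    consecutive expression states in a frame-level label sequence."""
    counts = defaultdict(lambda: defaultdict(int))
    for prev, nxt in zip(labels, labels[1:]):
        counts[prev][nxt] += 1
    # Normalize each row so outgoing probabilities sum to 1.
    return {
        src: {dst: n / sum(dsts.values()) for dst, n in dsts.items()}
        for src, dsts in counts.items()
    }

# Hypothetical per-frame labels from one recorded interview.
frames = ["neutral", "neutral", "surprise", "neutral", "fear", "fear"]
probs = transition_matrix(frames)
print(probs["neutral"])  # each observed successor of "neutral" gets probability 1/3
```

Group differences of the kind reported (e.g., neutral-to-fear vs. neutral-to-sadness transitions) would then amount to comparing entries of such matrices across participants.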


Subjects
Psychotic Disorders; Schizophrenia; Adult; Humans; Facial Expression; Emotions; Schizophrenia/diagnosis; Fear
15.
Cereb Cortex ; 34(3)2024 03 01.
Article in English | MEDLINE | ID: mdl-38466112

ABSTRACT

Alexithymia is characterized by difficulties in emotional information processing. However, the underlying reasons for emotional processing deficits in alexithymia are not fully understood. The present study aimed to investigate the mechanism underlying emotional deficits in alexithymia. Using the Toronto Alexithymia Scale-20, we recruited college students with high alexithymia (n = 24) or low alexithymia (n = 24) in this study. Participants judged the emotional consistency of facial expressions and contextual sentences while recording their event-related potentials. Behaviorally, the high alexithymia group showed longer response times versus the low alexithymia group in processing facial expressions. The event-related potential results showed that the high alexithymia group had more negative-going N400 amplitudes compared with the low alexithymia group in the incongruent condition. More negative N400 amplitudes are also associated with slower responses to facial expressions. Furthermore, machine learning analyses based on N400 amplitudes could distinguish the high alexithymia group from the low alexithymia group in the incongruent condition. Overall, these findings suggest worse facial emotion perception for the high alexithymia group, potentially due to difficulty in spontaneously activating emotion concepts. Our findings have important implications for the affective science and clinical intervention of alexithymia-related affective disorders.


Subjects
Affective Symptoms; Electroencephalography; Humans; Female; Male; Facial Expression; Evoked Potentials; Emotions
16.
PLoS One ; 19(3): e0300973, 2024.
Article in English | MEDLINE | ID: mdl-38512901

ABSTRACT

OBJECTIVE: Most previous studies have examined emotion recognition in autism spectrum condition (ASC) without intellectual disability (ID). However, ASC and ID co-occur to a high degree. The main aims of the study were to examine emotion recognition in individuals with ASC and co-occurring intellectual disability (ASC-ID) compared to individuals with ID alone, and to investigate the relationship between emotion recognition and social functioning. METHODS: The sample consisted of 30 adult participants with ASC-ID and a comparison group of 29 participants with ID. Emotion recognition was assessed with the Facial Emotions Test, while social functioning was assessed with the Social Responsiveness Scale, Second Edition (SRS-2). RESULTS: The accuracy of emotion recognition was significantly lower in individuals with ASC-ID than in the control group with ID, especially in identifying angry and fearful emotions. Participants with ASC-ID exhibited more pronounced difficulties in social functioning than those with ID, and there was a significant negative correlation between emotion recognition and social functioning. However, emotion recognition accounted for only 8% of the variability observed in social functioning. CONCLUSION: Our data indicate severe difficulties in the social-perceptual domain and in everyday social functioning in individuals with ASC-ID.


Subjects
Autism Spectrum Disorder; Autistic Disorder; Facial Recognition; Intellectual Disability; Adult; Humans; Autistic Disorder/psychology; Social Interaction; Intellectual Disability/psychology; Emotions; Autism Spectrum Disorder/psychology; Facial Expression
17.
Neuroimage Clin ; 41: 103586, 2024.
Article in English | MEDLINE | ID: mdl-38428325

ABSTRACT

BACKGROUND: Emotion processing deficits are known to accompany depressive symptoms and are often seen in stroke patients. Little is known about the influence of post-stroke depressive (PSD) symptoms and specific brain lesions on altered emotion processing abilities, and how these phenomena develop over time. This potential relationship may impact post-stroke rehabilitation of neurological and psychosocial function. To address this scientific gap, we investigated the relationship between PSD symptoms and emotion processing abilities in a longitudinal study design, from the first days post-stroke into the early chronic phase. METHODS: Twenty-six ischemic stroke patients performed an emotion processing task on videos of emotional faces ('happy,' 'sad,' 'anger,' 'fear,' and 'neutral') at different intensity levels (20%, 40%, 60%, 80%, 100%). Recognition accuracies and response times were measured, as well as depressive symptom scores (Montgomery-Åsberg Depression Rating Scale). Twenty-eight healthy participants matched in age and sex were included as a control group. Whole-brain support-vector regression lesion-symptom mapping (SVR-LSM) analyses were performed to investigate whether specific lesion locations were associated with the recognition accuracy of specific emotion categories. RESULTS: Stroke patients performed worse in overall recognition accuracy compared to controls, specifically in the recognition of happy, sad, and fearful faces. Notably, more depressed stroke patients showed increased processing of specific negative emotions: they responded significantly faster to angry faces and recognized low-intensity sad faces significantly more accurately. These effects, obtained for the first days after stroke, partly persisted to follow-up assessment several months later. SVR-LSM analyses revealed that inferior and middle frontal regions (IFG/MFG), the insula, and the putamen were associated with emotion-recognition deficits after stroke. Specifically, recognizing happy facial expressions was influenced by lesions affecting the anterior insula, putamen, IFG, MFG, orbitofrontal cortex, and rolandic operculum. Lesions in the posterior insula, rolandic operculum, and MFG were also related to reduced recognition accuracy for fearful facial expressions, whereas recognition deficits for sad faces were associated with frontal pole, IFG, and MFG damage. CONCLUSION: PSD symptoms facilitate the processing of negative emotional stimuli, specifically angry and sad facial expressions. The recognition accuracy of different emotional categories was linked to brain lesions in emotion-related processing circuits, including the insula, basal ganglia, IFG, and MFG. In summary, our study provides support for psychosocial and neural factors underlying emotional processing after stroke, contributing to the pathophysiology of PSD.


Subjects
Depression , Facial Recognition , Humans , Longitudinal Studies , Emotions/physiology , Anger , Brain/diagnostic imaging , Facial Expression , Facial Recognition/physiology
18.
IEEE Trans Image Process ; 33: 2293-2304, 2024.
Article in English | MEDLINE | ID: mdl-38470591

ABSTRACT

Human emotions comprise both basic and compound facial expressions. In many practical scenarios, it is difficult to access all compound expression categories at once. In this paper, we investigate comprehensive facial expression recognition (FER) in the class-incremental learning paradigm, where we define well-studied and easily accessible basic expressions as initial classes and learn new compound expressions incrementally. To alleviate the stability-plasticity dilemma in our incremental task, we propose a novel Relationship-Guided Knowledge Transfer (RGKT) method for class-incremental FER. Specifically, we develop a multi-region feature learning (MFL) module to extract fine-grained features that capture subtle differences between expressions. Based on the MFL module, we further design a basic expression-oriented knowledge transfer (BET) module and a compound expression-oriented knowledge transfer (CET) module that effectively exploit the relationships across expressions. The BET module initializes the new compound-expression classifiers based on the expression relevance between basic and compound expressions, improving the plasticity of our model for learning new classes. The CET module transfers expression-generic knowledge learned from new compound expressions to enrich the feature set of old expressions, improving the stability of our model against forgetting old classes. Extensive experiments on three facial expression databases show that our method achieves superior performance compared with several state-of-the-art methods.
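The core idea of the BET module, initializing a new compound-expression classifier from the weights of related basic-expression classifiers, can be sketched as a relevance-weighted combination of linear classifier heads. This toy version is an illustration of that initialization principle only; the expression names, relevance values, and linear-head assumption are not from the paper:

```python
import numpy as np

def init_compound_classifiers(basic_weights, relevance):
    """Initialize compound-class weight vectors as relevance-weighted
    combinations of existing basic-class weights.

    basic_weights: (n_basic, feat_dim) array of trained classifier weights.
    relevance:     (n_compound, n_basic) non-negative relatedness scores.
    """
    rel = relevance / relevance.sum(axis=1, keepdims=True)  # normalize rows
    return rel @ basic_weights  # (n_compound, feat_dim)

# Toy example: 3 basic expressions with 4-d feature weights
basic = np.array([[1.0, 0.0, 0.0, 0.0],   # happy
                  [0.0, 1.0, 0.0, 0.0],   # surprised
                  [0.0, 0.0, 1.0, 0.0]])  # fearful
# A hypothetical "happily surprised" class relates equally to happy and surprised
rel = np.array([[1.0, 1.0, 0.0]])
new_head = init_compound_classifiers(basic, rel)  # [[0.5, 0.5, 0.0, 0.0]]
```

Starting the new head near related basic classes gives incremental training a warm start, which is the plasticity benefit the abstract attributes to BET.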


Subjects
Facial Recognition , Humans , Emotions , Learning , Facial Expression , Databases, Factual
19.
Pediatr Blood Cancer ; 71(6): e30943, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38470289

ABSTRACT

BACKGROUND/OBJECTIVES: Survivors of pediatric brain tumors (SPBT) experience significant social challenges, including fewer friends and greater isolation than peers. Difficulties in face processing and visual social attention have been implicated in these outcomes. This study evaluated facial expression recognition (FER), social attention, and their associations with social impairments in SPBT. METHODS: SPBT (N = 54; ages 7-16) at least 2 years post-treatment completed a measure of FER, while parents completed measures of social impairment. A subset (N = 30) completed a social attention assessment that recorded eye gaze patterns while they watched videos depicting pairs of children engaged in joint play. Social Prioritization scores were calculated, with higher scores indicating more face looking. Correlation and regression analyses evaluated associations between variables, and a path analysis modeling tool (PROCESS) evaluated the indirect effects of Social Prioritization on social impairments through emotion-specific FER. RESULTS: Poorer recognition of angry and sad facial expressions was significantly correlated with greater social impairment. Social Prioritization was positively correlated with angry FER but not with recognition of other emotions. Social Prioritization had significant indirect effects on social impairments through angry FER. CONCLUSION: Findings suggest that interventions aimed at improving recognition of specific emotions may mitigate social impairments in SPBT. Further, reduced social attention (i.e., diminished face looking) could be a factor in reduced face processing ability, which may in turn result in social impairments. Longitudinal research is needed to elucidate the temporal associations between social attention, face processing, and social impairments.
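A face-looking score of the kind described, the proportion of gaze samples that land on face regions, can be computed directly from eye-tracker output and face bounding boxes. A minimal sketch under assumed coordinate and box formats (not the study's actual scoring pipeline, which would also handle blinks and track losses):

```python
def social_prioritization(gaze_samples, face_boxes):
    """Fraction of gaze samples falling inside any face bounding box.

    gaze_samples: iterable of (x, y) screen coordinates.
    face_boxes:   iterable of (x0, y0, x1, y1) rectangles.
    """
    samples = list(gaze_samples)
    hits = sum(
        any(x0 <= x <= x1 and y0 <= y <= y1 for x0, y0, x1, y1 in face_boxes)
        for x, y in samples
    )
    return hits / len(samples)

# Toy example: 3 of 4 samples fall inside the single face box
gaze = [(10, 10), (50, 50), (200, 200), (55, 40)]
faces = [(0, 0, 60, 60)]
score = social_prioritization(gaze, faces)  # 0.75
```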


Subjects
Attention , Brain Neoplasms , Cancer Survivors , Emotions , Facial Expression , Facial Recognition , Humans , Female , Male , Child , Adolescent , Brain Neoplasms/psychology , Cancer Survivors/psychology , Follow-Up Studies
20.
Autism Res ; 17(4): 824-837, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38488319

ABSTRACT

Accumulating evidence suggests that atypical emotion processing in autism may generalize across stimulus domains. However, this evidence comes from studies examining explicit emotion recognition. It remains unclear whether domain-general atypicality also applies to implicit emotion processing in autism, and what this implies for real-world social communication. To investigate this, we employed a novel cross-modal emotional priming task to assess implicit emotion processing of spoken/sung words (primes) through their influence on subsequent emotional judgments of faces/face-like objects (targets). We assessed whether implicit emotional priming differed between 38 autistic and 38 neurotypical individuals across age groups as a function of prime and target type. Results indicated no overall group differences across age groups, prime types, and target types. However, differential, domain-specific developmental patterns emerged for the autism and neurotypical groups. For neurotypical individuals, speech but not song primed the emotional judgment of faces across ages. This speech-orienting tendency was not observed across ages in the autism group, as priming of faces by speech was not seen in autistic adults. These results highlight the importance of the delicate weighting between speech versus song orientation in implicit emotion processing throughout development, providing more nuanced insight into the emotion processing profile of autistic individuals.
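Priming in tasks like this is conventionally quantified as the difference in target judgments between emotionally congruent and incongruent prime-target pairs, computed separately per prime type (speech vs. song) and target type. A sketch of such a priming index; the trial fields and the RT-based congruency scoring are illustrative assumptions, not the study's analysis:

```python
def priming_index(trials):
    """Mean RT difference (incongruent - congruent) in ms; positive values
    indicate that emotionally congruent primes speed up target judgments.

    trials: list of dicts with 'prime_emotion', 'target_emotion', 'rt_ms'.
    """
    congruent = [t["rt_ms"] for t in trials
                 if t["prime_emotion"] == t["target_emotion"]]
    incongruent = [t["rt_ms"] for t in trials
                   if t["prime_emotion"] != t["target_emotion"]]
    return sum(incongruent) / len(incongruent) - sum(congruent) / len(congruent)

# Toy trials: congruent pairs answered faster than incongruent ones
trials = [
    {"prime_emotion": "happy", "target_emotion": "happy", "rt_ms": 620},
    {"prime_emotion": "happy", "target_emotion": "sad",   "rt_ms": 700},
    {"prime_emotion": "sad",   "target_emotion": "sad",   "rt_ms": 640},
    {"prime_emotion": "sad",   "target_emotion": "happy", "rt_ms": 690},
]
effect = priming_index(trials)  # 65.0
```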


Subjects
Autism Spectrum Disorder , Autistic Disorder , Adult , Humans , Facial Expression , Emotions , Autistic Disorder/psychology , Judgment